Learning Adjective Meanings with a Tensor-Based Skip-Gram Model

Authors

  • Jean Maillard
  • Stephen Clark
Abstract

We present a compositional distributional semantic model which is an implementation of the tensor-based framework of Coecke et al. (2011). It is an extended skip-gram model (Mikolov et al., 2013) which we apply to adjective-noun combinations, learning nouns as vectors and adjectives as matrices. We also propose a novel measure of adjective similarity, and show that adjective matrix representations lead to improved performance in adjective and adjective-noun similarity tasks, as well as in the detection of semantically anomalous adjective-noun pairs.
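The core compositional step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dimensions and parameters here are random placeholders, whereas the paper learns them with a skip-gram-style objective; in the tensor-based framework, composing an adjective with a noun amounts to a matrix-vector product.

```python
import numpy as np

# Hypothetical dimension and random parameters, for illustration only;
# the actual model learns these representations from corpus data.
rng = np.random.default_rng(0)
dim = 4

noun_vec = rng.standard_normal(dim)        # a noun is a vector, e.g. "dog"
adj_mat = rng.standard_normal((dim, dim))  # an adjective is a matrix, e.g. "red"

# Composition: the adjective-noun phrase is the matrix applied to the vector.
phrase_vec = adj_mat @ noun_vec            # representation of "red dog"

def cosine(a, b):
    """Cosine similarity, the usual way to compare the resulting vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(phrase_vec.shape)  # the phrase lives in the same space as nouns: (4,)
```

Because composed phrases are ordinary vectors, phrase-phrase and phrase-noun similarity can be measured with the same cosine metric used for single words.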


Similar papers

Breaking Sticks and Ambiguities with Adaptive Skip-gram

The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications were proposed ...


Frame-Based Continuous Lexical Semantics through Exponential Family Tensor Factorization and Semantic Proto-Roles

We study how different frame annotations complement one another when learning continuous lexical semantics. We learn the representations from a tensorized skip-gram model that consistently encodes syntactic-semantic content better, with multiple 10% gains over baselines.


Explaining and Generalizing Skip-Gram through Exponential Family Principal Component Analysis

The popular skip-gram model induces word embeddings by exploiting the signal from word-context co-occurrence. We offer a new interpretation of skip-gram based on exponential family PCA—a form of matrix factorization. This makes it clear that we can extend the skip-gram method to tensor factorization, in order to train embeddings through richer higher-order co-occurrences, e.g., triples that include...


A Structured Vector Space Model for Hidden Attribute Meaning in Adjective-Noun Phrases

We present an approach to model hidden attributes in the compositional semantics of adjective-noun phrases in a distributional model. For the representation of adjective meanings, we reformulate the pattern-based approach for attribute learning of Almuhareb (2006) in a structured vector space model (VSM). This model is complemented by a structured vector space representing attribute dimensions ...


Learning Context-Sensitive Word Embeddings with Neural Tensor Skip-Gram Model

Distributed word representations have attracted rising interest in the NLP community. Most existing models assume only one vector for each individual word, which ignores polysemy and thus degrades their effectiveness for downstream tasks. To address this problem, some recent work adopts multi-prototype models to learn multiple embeddings per word type. In this paper, we distinguish the different senses o...



Publication date: 2015